707 Downloads
The largest option in the Gemma 2 model family. Built by Google from the same research and technology used to create the Gemini models.
Trained for tool use
Last Updated 9 days ago
Gemma 2 features the same extremely large vocabulary as release 1.1, which tends to help with multilingual and coding proficiency.
Gemma 2 27B was trained on a broad dataset of 13 trillion tokens, more than twice as many as Gemma 1.1 and 60% more than the 9B model, using similar data sources.
For more details, see the blog post here: https://huggingface.co/blog/gemma2
The underlying model files this model uses
Based on
When you download this model, LM Studio picks the source that will best suit your machine (you can override this)
Custom configuration options included with this model